Semantic Compositionality through Recursive Matrix-Vector Spaces

Authors

  • Richard Socher
  • Brody Huval
  • Christopher D. Manning
  • Andrew Y. Ng
Abstract

Single-word vector space models have been very successful at learning lexical information. However, they cannot capture the compositional meaning of longer phrases, preventing them from a deeper understanding of language. We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Our model assigns a vector and a matrix to every node in a parse tree: the vector captures the inherent meaning of the constituent, while the matrix captures how it changes the meaning of neighboring words or phrases. This matrix-vector RNN can learn the meaning of operators in propositional logic and natural language. The model obtains state-of-the-art performance on three different experiments: predicting fine-grained sentiment distributions of adverb-adjective pairs; classifying sentiment labels of movie reviews; and classifying semantic relationships such as cause-effect or topic-message between nouns using the syntactic path between them.
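The abstract's core idea, one vector plus one matrix per parse-tree node, can be sketched as a single composition step. The sketch below follows the paper's description (each child's matrix transforms the other child's vector; the parent matrix combines the child matrices), but the dimensionality, initialization, and parameter names `W` and `W_M` are illustrative assumptions, not the trained model.

```python
import numpy as np

n = 4  # word-vector dimensionality (illustrative; the paper learns these)
rng = np.random.default_rng(0)

# Hypothetical parameters: W mixes the two transformed child vectors,
# W_M mixes the two child matrices. In the paper both are learned.
W = 0.1 * rng.standard_normal((n, 2 * n))
W_M = 0.1 * rng.standard_normal((n, 2 * n))

def compose(a, A, b, B):
    """One matrix-vector RNN composition step for children (a, A), (b, B).

    Each constituent carries a vector (its inherent meaning) and a matrix
    (how it modifies neighbors). The parent vector applies each child's
    matrix to the other child's vector before a nonlinearity; the parent
    matrix is a linear combination of the child matrices.
    """
    p = np.tanh(W @ np.concatenate([B @ a, A @ b]))  # parent vector, shape (n,)
    P = W_M @ np.vstack([A, B])                      # parent matrix, shape (n, n)
    return p, P

# Example: compose two words, each with a random vector and a near-identity matrix.
a, A = rng.standard_normal(n), np.eye(n) + 0.1 * rng.standard_normal((n, n))
b, B = rng.standard_normal(n), np.eye(n) + 0.1 * rng.standard_normal((n, n))
p, P = compose(a, A, b, B)
```

Applied bottom-up over a parse tree, this step yields a vector and matrix for every phrase, which is what the three downstream classification experiments consume.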


Similar Articles

Non-Linear Similarity Learning for Compositionality

Many NLP applications rely on the existence of similarity measures over text data. Although word vector space models provide good similarity measures between words, phrasal and sentential similarities derived from the composition of individual words remain a difficult problem. In this paper, we propose a new method of non-linear similarity learning for semantic compositionality. In this metho...


Vector Space Modelling of Natural Language

Vector space modelling of words has been a focus of research in the field of computational linguistics over the past decade. The aim of Vector Space Models is to project the words in a text corpus onto a vector space such that semantically similar words lie close together in the vector space (also called the semantic space). Recently, the research focus has shifted towards semantic compositionali...


Recursive Nested Neural Network for Sentiment Analysis

Early sentiment prediction systems use semantic vector representations of words to express longer phrases and sentences. These methods proved to have poor performance, since they do not consider compositionality in language. Recently, many richer models have been proposed to understand compositionality in natural language for better sentiment prediction. Most of these algorithms are...


Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

Semantic word spaces have been very useful but cannot express the meaning of longer phrases in a principled way. Further progress towards understanding compositionality in tasks such as sentiment detection requires richer supervised training and evaluation resources and more powerful models of composition. To remedy this, we introduce a Sentiment Treebank. It includes fine-grained sentiment lab...


Detecting Compositionality of Multi-Word Expressions using Nearest Neighbours in Vector Space Models

We present a novel unsupervised approach to detecting the compositionality of multi-word expressions. We compute the compositionality of a phrase through substituting the constituent words with their “neighbours” in a semantic vector space and averaging over the distance between the original phrase and the substituted neighbour phrases. Several methods of obtaining neighbours are presented. The...



Journal title:

Publication date: 2012